A Penalized Maximum Likelihood Approach to Sparse Factor Analysis

Authors

  • Jang Choi
  • Hui Zou
  • Gary Oehlert
Abstract

Factor analysis is a popular multivariate analysis method which is used to describe observed variables as linear combinations of hidden factors. In applications one usually needs to rotate the estimated factor loading matrix in order to obtain a more understandable model. In this article, an l1 penalization method is introduced for performing sparse factor analysis in which factor loadings naturally adopt a sparse representation, greatly facilitating the interpretation of the fitted factor model. A generalized expectation-maximization algorithm is developed for computing the l1 penalized estimator. Efficacy of the proposed methodology and algorithm is demonstrated by simulated and real data.
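
The article's generalized EM algorithm is not reproduced here. As a rough illustration of how an l1 penalty on the loadings induces sparsity, the sketch below alternates between estimating factor scores and lasso-regressing each observed variable on those scores; the function name fit_sparse_factors, the tuning constant lam, and the alternating scheme itself are illustrative assumptions rather than the authors' procedure.

```python
# Illustrative sketch only: NOT the article's generalized EM algorithm.
# It alternates between least-squares factor scores and per-variable lasso
# regressions, which mimics an l1 penalty on the loading matrix.
import numpy as np
from sklearn.linear_model import Lasso

def fit_sparse_factors(X, n_factors, lam=0.05, n_iter=50, seed=0):
    """Alternating least squares with an l1 penalty on the loadings (a sketch)."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=0)                       # centre the observed variables
    n, p = X.shape
    L = rng.normal(size=(p, n_factors))          # loading matrix, p x k
    for _ in range(n_iter):
        # score step: least-squares factor scores given the current loadings
        Z = X @ L @ np.linalg.pinv(L.T @ L)
        Z /= Z.std(axis=0) + 1e-12               # fix the scale indeterminacy of the factors
        # loading step: lasso regression of each variable on the scores
        for j in range(p):
            reg = Lasso(alpha=lam, fit_intercept=False, max_iter=5000)
            L[j] = reg.fit(Z, X[:, j]).coef_
    return L, Z

# toy example: 3 sparse factors driving 12 observed variables
rng = np.random.default_rng(1)
true_L = np.zeros((12, 3))
true_L[0:4, 0] = 1.0; true_L[4:8, 1] = 1.0; true_L[8:12, 2] = 1.0
Z_true = rng.normal(size=(500, 3))
X = Z_true @ true_L.T + 0.3 * rng.normal(size=(500, 12))
L_hat, _ = fit_sparse_factors(X, n_factors=3)
print(np.round(L_hat, 2))                        # many loadings shrink to exactly 0
```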


Similar articles

Avoiding infinite estimates in logistic regression – theory, solutions, examples

In logistic regression analyses of small or sparse data sets, results obtained by maximum likelihood methods cannot generally be trusted. In such analyses, although the likelihood meets the convergence criterion, at least one parameter may diverge to plus or minus infinity. This situation has been termed 'separation'. Examples of two studies are given, where the phenomenon of separation occurre...
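
As a concrete illustration of the separation phenomenon described above (not taken from the study itself), the snippet below evaluates the logistic log-likelihood of a perfectly separated one-dimensional data set at ever larger slope values: the likelihood keeps improving, so no finite maximum likelihood estimate exists.

```python
# Toy illustration of 'separation' in logistic regression (not from the article).
import numpy as np

x = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
y = np.array([0, 0, 0, 1, 1, 1])          # y = 1 exactly when x > 0: perfect separation

def loglik(beta):
    eta = beta * x
    return np.sum(y * eta - np.log1p(np.exp(eta)))

for beta in [1, 5, 10, 50, 100]:
    print(f"beta = {beta:5d}   log-likelihood = {loglik(beta):.6f}")
# The log-likelihood increases toward 0 as beta grows without bound,
# so the maximum likelihood estimate of the slope is infinite.
```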


A penalized simulated maximum likelihood approach in parameter estimation for stochastic differential equations

We consider the problem of estimating parameters of stochastic differential equations (SDEs) with discrete-time observations that are either completely or partially observed. The transition density between two observations is generally unknown. We propose an importance sampling approach with an auxiliary parameter which improves approximation of the transition density. We embed the auxiliary im...
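
The importance-sampling refinement proposed in the paper is not reproduced here; the sketch below only shows the basic simulated-likelihood idea it builds on, approximating each transition density by Monte-Carlo simulation of Euler sub-steps for an illustrative Ornstein-Uhlenbeck model. All function names, the model, and the tuning constants are assumptions.

```python
# Basic simulated maximum likelihood for an SDE via Euler/Monte-Carlo transition
# densities (a sketch; the article's importance-sampling improvement is not shown).
import numpy as np

def sim_transition_density(x0, x1, theta, dt, n_sub=10, n_paths=2000, rng=None):
    """Monte-Carlo estimate of p(x1 | x0) for dX = -theta*X dt + dW over time dt."""
    rng = rng or np.random.default_rng(0)
    h = dt / n_sub
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_sub - 1):                   # simulate Euler paths up to time dt - h
        x += -theta * x * h + np.sqrt(h) * rng.standard_normal(n_paths)
    # Gaussian density of the final Euler step landing at x1, averaged over paths
    mean = x - theta * x * h
    dens = np.exp(-(x1 - mean) ** 2 / (2 * h)) / np.sqrt(2 * np.pi * h)
    return dens.mean()

def neg_sim_loglik(theta, obs, dt):
    return -sum(np.log(sim_transition_density(obs[i], obs[i + 1], theta, dt) + 1e-300)
                for i in range(len(obs) - 1))

# simulate an Ornstein-Uhlenbeck path and recover theta by a crude grid search
rng = np.random.default_rng(1)
theta_true, dt, n = 1.5, 0.5, 200
path = [0.0]
for _ in range(n):
    x = path[-1]
    for _ in range(20):                          # fine-grained Euler simulation of the truth
        x += -theta_true * x * (dt / 20) + np.sqrt(dt / 20) * rng.standard_normal()
    path.append(x)
obs = np.array(path)
grid = np.linspace(0.5, 3.0, 11)
print(min(grid, key=lambda th: neg_sim_loglik(th, obs, dt)))  # should land near theta_true
```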


A comparative investigation of methods for logistic regression with separated or nearly separated data.

In logistic regression analysis of small or sparse data sets, results obtained by classical maximum likelihood methods cannot generally be trusted. In such analyses it may even happen that the likelihood meets the convergence criteria while at least one parameter estimate diverges to +/-infinity. This situation has been termed 'separation', and it typically occurs whenever no events are observe...
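
One of the standard remedies compared in this line of work is Firth's penalized likelihood, which keeps estimates finite under separation. The sketch below implements a plain modified-score Newton iteration for it on a perfectly separated toy data set; it is an illustrative implementation, not the specific procedures or settings examined in the article.

```python
# Illustrative Firth-type bias-reduced logistic regression (not the article's code).
import numpy as np

def firth_logistic(X, y, n_iter=50, tol=1e-8):
    """Firth's penalized-likelihood logistic regression via modified-score Newton steps."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        eta = X @ beta
        pi = 1.0 / (1.0 + np.exp(-eta))
        W = pi * (1.0 - pi)                                   # working weights
        XWX_inv = np.linalg.inv(X.T @ (W[:, None] * X))
        Xw = X * np.sqrt(W)[:, None]
        h = np.einsum('ij,jk,ik->i', Xw, XWX_inv, Xw)         # leverages of the weighted hat matrix
        score = X.T @ (y - pi + h * (0.5 - pi))               # Firth-adjusted score
        step = XWX_inv @ score
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    return beta

# perfectly separated toy data: ordinary ML diverges, the Firth estimate stays finite
X = np.column_stack([np.ones(6), [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]])
y = np.array([0, 0, 0, 1, 1, 1.0])
print(firth_logistic(X, y))
```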


High dimensional Sparse Gaussian Graphical Mixture Model

This paper considers the problem of network reconstruction from heterogeneous data using a Gaussian Graphical Mixture Model (GGMM). It is well known that parameter estimation in this context is challenging due to the large number of variables coupled with the degenerate nature of the likelihood. We propose as a solution a penalized maximum likelihood technique by imposing an l1 penalty on the pre...
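
The mixture-model machinery of the paper is not reproduced here; the snippet below only illustrates the single-component building block, l1-penalized precision matrix estimation (the graphical lasso), using scikit-learn's GraphicalLasso on simulated data with a sparse chain-structured precision matrix.

```python
# Single-component illustration only: l1-penalized precision estimation with
# scikit-learn's GraphicalLasso (the article embeds this idea in a mixture model).
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
p = 10
# sparse "chain" precision matrix: variable i is linked only to its neighbours
Theta = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
Sigma = np.linalg.inv(Theta)
X = rng.multivariate_normal(np.zeros(p), Sigma, size=500)

model = GraphicalLasso(alpha=0.1).fit(X)         # alpha is the l1 penalty weight
est_support = np.abs(model.precision_) > 1e-6
print(est_support.astype(int))                   # recovered sparsity pattern of the precision
```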


Sparse Precision Matrix Estimation via Lasso Penalized D-Trace Loss

We introduce a constrained empirical loss minimization framework for estimating high-dimensional sparse precision matrices and propose a new loss function, called the D-trace loss, for that purpose. A novel sparse precision matrix estimator is defined as the minimizer of the lasso penalized D-trace loss under a positive-definiteness constraint. Under a new irrepresentability condition, the lasso...
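
As a rough illustration (not the article's estimator), the sketch below minimizes the lasso-penalized D-trace loss 0.5*tr(Theta S Theta) - tr(Theta) + lam*||Theta||_1 with a plain proximal-gradient loop; it ignores the positive-definiteness constraint that the actual estimator enforces, so it only demonstrates the loss and the l1 shrinkage.

```python
# Proximal-gradient (ISTA) sketch for the lasso-penalized D-trace loss,
# without the positive-definiteness constraint used in the article.
import numpy as np

def soft_threshold(A, t):
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def dtrace_lasso(S, lam, n_iter=500):
    p = S.shape[0]
    step = 1.0 / np.linalg.eigvalsh(S)[-1]       # 1 / Lipschitz constant of the gradient
    Theta = np.eye(p)
    for _ in range(n_iter):
        grad = 0.5 * (S @ Theta + Theta @ S) - np.eye(p)   # gradient of the D-trace loss
        G = Theta - step * grad                            # plain gradient step
        Theta_new = soft_threshold(G, step * lam)          # l1 shrinkage of the entries
        np.fill_diagonal(Theta_new, np.diag(G))            # leave the diagonal unpenalized
        Theta = 0.5 * (Theta_new + Theta_new.T)            # keep the iterate symmetric
    return Theta

rng = np.random.default_rng(0)
p = 8
Theta_true = np.eye(p) + 0.4 * (np.eye(p, k=1) + np.eye(p, k=-1))
X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(Theta_true), size=400)
S = np.cov(X, rowvar=False)
print(np.round(dtrace_lasso(S, lam=0.1), 2))     # entries away from the chain shrink to 0
```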



Journal:

Volume   Issue

Pages  -

Publication date: 2011